Right-sizing artificial intelligence: The overlooked key to more sustainable technology
This blog is co-authored by Professor Aleksandra Przegalińska and Denise Lee.
As artificial intelligence (AI) moves from the hypothetical to the real world of practical applications, it’s becoming clear that bigger is not always better.
Recent experiences in AI development and deployment have shed light on the power of tailored, ‘proportional’ approaches. While the pursuit of ever-larger models and more powerful systems has been a common trend, the AI community is increasingly recognizing the value of right-sized solutions. These more focused and efficient approaches are proving remarkably successful in developing sustainable AI models that not only reduce resource consumption but also lead to better outcomes.
By prioritizing proportionality, developers have the potential to create AI systems that are more adaptable, cost-effective, and environmentally friendly, without sacrificing performance or capability. This shift in perspective is driving innovation in ways that align technological advancement with sustainability goals, demonstrating that ‘smarter’ often trumps ‘bigger’ in the realm of AI development. This realization is prompting a reevaluation of our fundamental assumptions about AI progress – one that considers not just the raw capabilities of AI systems but also their efficiency, scalability, and environmental impact.
From our vantage points in academia (Aleksandra) and business (Denise), we have observed a critical question emerge that demands considerable reflection: How can we harness AI’s incredible potential in a sustainable way? The answer lies in a principle that’s deceptively simple yet maddeningly overlooked: proportionality.
The computational resources required to train and operate generative AI models are substantial. To put this in perspective, consider the following data: Researchers estimated that training a single large language model can consume around 1,287 MWh of electricity and emit 552 tons of carbon dioxide equivalent.[1] This is comparable to the energy consumption of an average American household over 120 years.[2]
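The household comparison is easy to verify with quick arithmetic. Here is a minimal sketch, assuming roughly 10,700 kWh per year for an average American household (a commonly cited ballpark); that assumption, and the constant names, are ours for illustration:

```python
# Back-of-the-envelope check of the household comparison above.
# TRAINING_ENERGY_MWH is the Patterson et al. (2021) estimate quoted in
# the text; HOUSEHOLD_KWH_PER_YEAR is our assumed US-household average.

TRAINING_ENERGY_MWH = 1_287        # energy to train one large language model
HOUSEHOLD_KWH_PER_YEAR = 10_700    # assumption: average American household

years = TRAINING_ENERGY_MWH * 1_000 / HOUSEHOLD_KWH_PER_YEAR
print(f"Roughly {years:.0f} household-years of electricity")  # -> roughly 120
```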
Researchers also estimate that by 2027, the electricity demand for AI could range from 85 to 134 TWh annually.[3] To contextualize this figure, it surpasses the yearly electricity consumption of countries like the Netherlands (108.5 TWh in 2020) or Sweden (124.4 TWh in 2020).[4]
While these figures are significant, it’s crucial to consider them in the context of AI’s broader potential. AI systems, despite their energy requirements, have the capacity to drive efficiencies across various sectors of the technology landscape and beyond.
For instance, AI-optimized cloud computing services have shown the potential to reduce energy consumption by up to 30% in data centers.[5] In software development, AI-powered code completion tools can significantly reduce the time and computational resources needed for programming tasks, potentially saving millions of CPU hours annually across the industry.[6]
Still, striking a balance between AI's energy demands and its potential to drive efficiency is exactly where proportionality comes in. It's about right-sizing our AI solutions. Using a scalpel instead of a chainsaw. Opting for a nimble electric scooter when a gas-guzzling SUV is overkill.
We’re not suggesting we abandon cutting-edge AI research. Far from it. But we can be smarter about how and when we deploy these powerful tools. In many cases, a smaller, specialized model can do the job just as well – and with a fraction of the environmental impact.[7] It’s really about smart business. Efficiency. Sustainability.
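To make that idea concrete, here is a minimal sketch of what right-sizing can look like in practice, assuming a narrow task such as sentiment classification and the Hugging Face transformers library; the specific model named below is one illustrative choice, not a recommendation from us:

```python
# A minimal sketch of right-sizing: for a narrow task such as sentiment
# classification, a small fine-tuned model is often enough; there is no
# need to route the request through a multi-billion-parameter LLM.
# Assumes the Hugging Face `transformers` library is installed. The model
# below (~67M parameters) is one illustrative choice, not a recommendation.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

result = classifier("The quarterly energy report looks great.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```

A model of this size can run comfortably on a CPU, which is precisely the point: matching the tool to the task rather than defaulting to the largest available system.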
However, moving to a proportional mindset can be challenging. It requires a level of AI literacy that many organizations are still grappling with. It calls for a robust interdisciplinary dialogue between technical experts, business strategists, and sustainability specialists. Such collaboration is essential for developing and implementing truly intelligent and efficient AI strategies.
These strategies will prioritize intelligence in design, efficiency in execution, and sustainability in practice. The role of energy-efficient hardware and networking in data center modernization cannot be overstated.
By leveraging state-of-the-art, power-optimized processors and high-efficiency networking equipment, organizations can significantly reduce the energy footprint of their AI workloads. Furthermore, implementing comprehensive energy visibility systems provides invaluable insights into the emissions impact of AI operations. This data-driven approach enables companies to make informed decisions about resource allocation, identify areas for improvement, and accurately measure the environmental impact of their AI initiatives. As a result, organizations can not only reduce costs but also demonstrate tangible progress toward their sustainability goals.
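As one illustration of what such energy visibility can feed into, here is a minimal sketch of a grid-intensity emissions estimate; the function name and both example figures are illustrative assumptions, since real carbon intensity varies by region and by hour:

```python
# A minimal sketch of turning energy-visibility data into an emissions
# estimate: metered energy (kWh) times the grid's carbon intensity
# (kg CO2e per kWh). The function name and both example figures are
# illustrative assumptions; real intensity varies by region and by hour.

def estimated_emissions_kg(energy_kwh: float, kg_co2e_per_kwh: float) -> float:
    """Estimate operational CO2e for a workload from metered energy."""
    return energy_kwh * kg_co2e_per_kwh

# Example: a 500 kWh inference workload on a grid at ~0.4 kg CO2e/kWh.
print(f"{estimated_emissions_kg(500, 0.4):.0f} kg CO2e")  # -> 200 kg CO2e
```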
Paradoxically, the most impactful and judicious application of AI may often be the one that uses fewer computational resources, balancing performance with environmental impact. By combining proportional AI development with cutting-edge, energy-efficient infrastructure and robust energy monitoring, we can create a more sustainable and responsible AI ecosystem.
The solutions we create will not come from a single source. As our collaboration has taught us, academia and business have much to learn from each other. AI that scales responsibly will be the product of many people working together on ethical frameworks, integrating diverse perspectives, and committing to transparency.
Let’s make AI work for us.
[1] Patterson, D., Gonzalez, J., Le, Q., Liang, C., Munguia, L.-M., Rothchild, D., So, D., Texier, M., & Dean, J. (2021). Carbon emissions and large neural network training. arXiv:2104.10350.
[2] Mehta, S. (2024, July 4). How much energy do LLMs consume? Unveiling the power behind AI. Association of Data Scientists.
[3] de Vries, A. (2023). The growing energy footprint of Artificial Intelligence. Joule, 7(10), 2191–2194. doi:10.1016/j.joule.2023.09.004
[4] de Vries, A. (2023). The growing energy footprint of Artificial Intelligence. Joule, 7(10), 2191–2194. doi:10.1016/j.joule.2023.09.004
[5] Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and policy considerations for deep learning in NLP. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. doi:10.18653/v1/p19-1355
[6] Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and policy considerations for deep learning in NLP. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. doi:10.18653/v1/p19-1355
[7] CottGroup. (2024). Smaller and more efficient artificial intelligence models.